Random Feature Expansions for Deep Gaussian Processes

Authors

  • Kurt Cutajar
  • Edwin V. Bonilla
  • Pietro Michiardi
  • Maurizio Filippone
Abstract

The composition of multiple Gaussian Processes as a Deep Gaussian Process (DGP) enables a deep probabilistic nonparametric approach to flexibly tackle complex machine learning problems with sound quantification of uncertainty. Existing inference approaches for DGP models have limited scalability and are notoriously cumbersome to construct. In this work we introduce a novel formulation of DGPs based on random feature expansions that we train using stochastic variational inference. This yields a practical learning framework which significantly advances the state of the art in inference for DGPs, and enables accurate quantification of uncertainty. We extensively showcase the scalability and performance of our proposal on several datasets with up to 8 million observations, and various DGP architectures with up to 30 hidden layers.
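The random feature expansions the abstract refers to replace each GP layer's kernel with a finite Monte Carlo approximation. As a rough sketch of the underlying idea (not the authors' code; the function name and parameters are illustrative), random Fourier features for an RBF kernel can be written as:

```python
import numpy as np

def rff_features(X, num_features=100, lengthscale=1.0, seed=0):
    """Random Fourier features approximating an RBF kernel (Rahimi & Recht).

    Function name and defaults are illustrative, not from the paper.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the spectral density of the RBF kernel
    Omega = rng.normal(0.0, 1.0 / lengthscale, size=(d, num_features))
    proj = X @ Omega
    # Scaled cos/sin features so that phi(x) . phi(y) approximates k(x, y)
    return np.hstack([np.cos(proj), np.sin(proj)]) / np.sqrt(num_features)

# Sanity check: feature inner products approximate the exact RBF kernel
X = np.random.default_rng(1).normal(size=(5, 3))
Phi = rff_features(X, num_features=5000)
K_approx = Phi @ Phi.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sq_dists)  # RBF kernel with lengthscale 1
```

Stacking such feature maps layer by layer, with variational posteriors over the layer weights, gives the flavor of the DGP construction described above.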


Similar resources

Supplementary Material Random Feature Expansions for Deep Gaussian Processes

A. Additional Experiments. Using the experimental set-up described in Section 4, Figure 1 demonstrates how the competing models perform with regard to the RMSE (or error rate) and MNLL metrics when two hidden layers are incorporated into each model. The result...


Expansions for Gaussian processes and Parseval frames

We derive a precise link between series expansions of Gaussian random vectors in a Banach space and Parseval frames in their reproducing kernel Hilbert space. The results are applied to pathwise continuous Gaussian processes and a new optimal expansion for fractional Ornstein-Uhlenbeck processes is derived.
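To make the link concrete: for a Parseval frame $\{f_k\}$ of the reproducing kernel Hilbert space of a Gaussian vector $X$, the series expansion in question takes the form (a standard statement of this type of result, not a quote from the paper):

```latex
X = \sum_{k=1}^{\infty} \xi_k \, f_k,
\qquad \xi_k \overset{\text{iid}}{\sim} \mathcal{N}(0, 1),
```

with the series converging in the norm of the Banach space.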


Complete convergence of moving-average processes under negative dependence sub-Gaussian assumptions

Complete convergence is investigated for moving-average processes of a doubly infinite sequence of negatively dependent sub-Gaussian random variables with zero means, finite variances, and absolutely summable coefficients. As a corollary, the rate of complete convergence is obtained under some suitable conditions on the coefficients.
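For reference, a moving-average process of the kind described (notation assumed here, following the usual convention) is:

```latex
X_n = \sum_{i=-\infty}^{\infty} a_{i+n} \, Y_i,
\qquad \sum_{i=-\infty}^{\infty} |a_i| < \infty,
```

where $\{Y_i\}$ is the doubly infinite sequence of negatively dependent sub-Gaussian variables with zero means and finite variances.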


Karhunen–Loève expansion for multi-correlated stochastic processes

We propose two different approaches generalizing the Karhunen–Loève series expansion to model and simulate multi-correlated non-stationary stochastic processes. The first approach (muKL) is based on the spectral analysis of a suitable assembled stochastic process and yields series expansions in terms of an identical set of uncorrelated random variables. The second approach (mcKL) relies on expa...
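In the discrete setting, the classical single-process Karhunen-Loève expansion that both muKL and mcKL generalize reduces to an eigendecomposition of the covariance matrix. A minimal sketch (the Brownian-motion covariance is chosen here only as an example; function name illustrative):

```python
import numpy as np

def kl_expansion(cov, num_terms):
    """Truncated Karhunen-Loeve basis from a discretized covariance matrix.

    Illustrative single-process version of what muKL/mcKL generalize.
    """
    eigvals, eigvecs = np.linalg.eigh(cov)         # ascending order
    order = np.argsort(eigvals)[::-1][:num_terms]  # keep the largest modes
    return eigvals[order], eigvecs[:, order]

# Example: Brownian motion, covariance min(s, t), on a uniform grid
t = np.linspace(0.01, 1.0, 200)
cov = np.minimum(t[:, None], t[None, :])
lam, phi = kl_expansion(cov, 10)

# A sample path: sum_k sqrt(lam_k) * xi_k * phi_k, with xi_k ~ N(0, 1)
xi = np.random.default_rng(0).normal(size=10)
path = phi @ (np.sqrt(lam) * xi)
```

The multi-correlated extensions in the abstract replace the single covariance matrix with an assembled or cross-covariance structure, but the sample-then-sum pattern stays the same.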


Accelerating Deep Gaussian Processes Inference with Arc-Cosine Kernels

Deep Gaussian Processes (DGPs) are probabilistic deep models obtained by stacking multiple layers implemented through Gaussian Processes (GPs). Although attractive from a theoretical point of view, learning DGPs poses some significant computational challenges that arguably hinder their application to a wider variety of problems for which Deep Neural Networks (DNNs) are the preferred choice. We ...
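For context, the order-1 arc-cosine kernel (Cho and Saul), which corresponds to an infinitely wide single ReLU layer with Gaussian weights, has a closed form. A minimal sketch (function name illustrative, not from the paper):

```python
import numpy as np

def arc_cosine_kernel(X, Y):
    """Order-1 arc-cosine kernel:
    k(x, y) = (1/pi) * ||x|| * ||y|| * (sin(t) + (pi - t) * cos(t)),
    where t is the angle between x and y.
    """
    nx = np.linalg.norm(X, axis=1)[:, None]
    ny = np.linalg.norm(Y, axis=1)[None, :]
    # Clip to guard against round-off outside [-1, 1]
    cos_theta = np.clip((X @ Y.T) / (nx * ny), -1.0, 1.0)
    theta = np.arccos(cos_theta)
    return (nx * ny / np.pi) * (np.sin(theta) + (np.pi - theta) * np.cos(theta))

X = np.random.default_rng(0).normal(size=(4, 3))
K = arc_cosine_kernel(X, X)
```

On the diagonal the angle is zero, so k(x, x) = ||x||^2, mirroring the variance of a wide ReLU layer's output.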




Publication date: 2017